Async Processing
Asynchronous processing means the client sends a request and, instead of waiting for the task to finish, the server immediately responds with a task reference or status URL. The client can then poll that URL or wait to be notified when the task completes.
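The flow can be sketched end to end. The in-memory FakeServer below is a hypothetical stand-in for real HTTP calls, not a specific framework; submit plays the role of the request that returns immediately, and run_pending stands in for background work finishing later:

```python
import itertools

class FakeServer:
    """Stand-in for the server side of an async API."""

    def __init__(self):
        self._ids = itertools.count(1)
        self._tasks = {}

    def submit(self, payload):
        # Responds immediately with a task reference (the 202 Accepted step);
        # no work has been done yet.
        task_id = str(next(self._ids))
        self._tasks[task_id] = {"status": "pending", "payload": payload}
        return {"taskId": task_id, "status": "pending"}

    def run_pending(self):
        # Simulates the background worker completing queued tasks out of band.
        for task in self._tasks.values():
            task["status"] = "completed"

    def status(self, task_id):
        return {"taskId": task_id, "status": self._tasks[task_id]["status"]}

server = FakeServer()
ack = server.submit({"reportType": "monthly_sales", "format": "pdf"})
print(ack["status"])                            # pending: client was not blocked
server.run_pending()                            # background work happens later
print(server.status(ack["taskId"])["status"])   # completed
```

The key point the sketch illustrates: the client holds only a taskId between the two calls, so neither side keeps a connection open while the work runs.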
Why Async is Important for Performance & Scalability
- Non-blocking operations: The server does not tie up request-handling threads or connections while the task runs.
- Improved response latency: The client gets an acknowledgment immediately, even if the result takes minutes to produce.
- Better resource utilization: Long-running tasks are handled by background workers or queues.
- Higher throughput: The server can accept many requests concurrently.
Common Pattern: Task Queue
- Client sends request:
POST /api/reports
Content-Type: application/json
{
"reportType": "monthly_sales",
"format": "pdf"
}
- Server immediately responds:
HTTP/1.1 202 Accepted
Content-Type: application/json
Location: /api/reports/789/status
{
"taskId": "789",
"status": "pending"
}
- Client polls status:
GET /api/reports/789/status
- Server responds:
HTTP/1.1 200 OK
Content-Type: application/json
{
"taskId": "789",
"status": "completed",
"resultUrl": "/api/reports/789/download"
}
- Client downloads the result:
GET /api/reports/789/download
- 202 Accepted → Server accepted the task but hasn't completed it.
- Location header → Provides a status endpoint to check progress.
- This pattern decouples the client request from long-running processing, improving scalability.
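A minimal server-side sketch of the steps above, assuming an in-process queue.Queue and a worker thread in place of a real broker and web framework. handle_post_report and handle_get_status are hypothetical names playing the roles of the POST /api/reports and GET /api/reports/{id}/status handlers:

```python
import itertools
import queue
import threading
import time

tasks = {}                  # taskId -> status record (a database in practice)
work_queue = queue.Queue()  # stands in for RabbitMQ/Kafka/SQS
_next_id = itertools.count(789)

def handle_post_report(body):
    """Role of POST /api/reports: enqueue the work, answer immediately."""
    task_id = str(next(_next_id))
    tasks[task_id] = {"taskId": task_id, "status": "pending"}
    work_queue.put((task_id, body))
    # The HTTP layer would also set "Location: /api/reports/%s/status" % task_id
    return 202, tasks[task_id]

def handle_get_status(task_id):
    """Role of GET /api/reports/{id}/status."""
    return 200, tasks[task_id]

def worker():
    while True:
        task_id, body = work_queue.get()
        tasks[task_id]["status"] = "in_progress"
        time.sleep(0.01)  # pretend to generate the report
        tasks[task_id].update(status="completed",
                              resultUrl="/api/reports/%s/download" % task_id)
        work_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

code, ack = handle_post_report({"reportType": "monthly_sales", "format": "pdf"})
work_queue.join()  # here only so the demo can observe completion; a real client polls
print(handle_get_status(ack["taskId"])[1]["status"])  # completed
```

Note that the POST handler returns before any report is generated; the worker mutates the shared status record, which is exactly what the status endpoint reads back.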
Asynchronous Processing Design Tips
- Use task queues (e.g., RabbitMQ, Kafka, AWS SQS) for background work.
- Track task status: pending, in_progress, completed, failed.
- Return a taskId to allow polling or result retrieval.
- Avoid synchronous blocking on long tasks.
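The polling side of these tips can be sketched as a small helper with exponential backoff, so clients don't hammer the status endpoint. fetch_status is a hypothetical callable standing in for GET /api/reports/{id}/status:

```python
import time

def poll_until_done(fetch_status, task_id, base_delay=0.01, max_attempts=8):
    """Poll a task's status with exponential backoff until it settles."""
    delay = base_delay
    for _ in range(max_attempts):
        record = fetch_status(task_id)
        if record["status"] in ("completed", "failed"):  # terminal states
            return record
        time.sleep(delay)
        delay *= 2  # back off between polls
    raise TimeoutError("task %s did not settle in time" % task_id)

# Usage with a fake status endpoint that completes on the third call:
statuses = iter(["pending", "in_progress", "completed"])
result = poll_until_done(lambda _tid: {"taskId": "789", "status": next(statuses)},
                         "789")
print(result["status"])  # completed
```

Treating both completed and failed as terminal keeps the client from polling a task that can no longer change; the caller then branches on which state it got.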